Conversation

@anuraaga anuraaga commented Jul 24, 2025

Continues the OpenAI instrumentation, adding support for the async client. The types work out such that the proxy entry points essentially need to be duplicated, but the telemetry is still mostly contained in the instrumentation API usage, and streaming handling was refactored into a shareable class.

Also passes context to logs in all cases, which is required for async and will be a bit faster in sync anyway.

/cc @codefromthecrypt

@anuraaga anuraaga requested a review from a team as a code owner July 24, 2025 08:07

public static void emitPromptLogEvents(
    Context ctx,
    Logger eventLogger, ChatCompletionCreateParams request, boolean captureMessageContent) {
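For background on why the signature gains a Context parameter: in async code the log may be emitted on a pool thread where the thread-local "current" context is no longer set, so the caller has to capture the context up front and pass it explicitly. A minimal stdlib-only sketch of the failure mode, with a plain ThreadLocal standing in for OpenTelemetry's Context and illustrative method names (not the instrumentation's real API):

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class ExplicitContext {
  // Stand-in for the real context mechanism (assumption for illustration).
  static final ThreadLocal<String> CURRENT = new ThreadLocal<>();

  // Implicit variant: reads the thread-local, which is lost on a pool thread.
  static String emitWithImplicitContext() {
    return "log ctx=" + CURRENT.get();
  }

  // Explicit variant: the caller captured the context, so it is always correct.
  static String emitWithExplicitContext(String context) {
    return "log ctx=" + context;
  }

  public static void main(String[] args) {
    CURRENT.set("request-span");
    String captured = CURRENT.get(); // captured on the request thread

    ExecutorService pool = Executors.newSingleThreadExecutor();
    try {
      String implicit = CompletableFuture
          .supplyAsync(ExplicitContext::emitWithImplicitContext, pool).join();
      String explicit = CompletableFuture
          .supplyAsync(() -> emitWithExplicitContext(captured), pool).join();
      System.out.println(implicit);  // prints "log ctx=null" - thread-local lost
      System.out.println(explicit);  // prints "log ctx=request-span"
    } finally {
      pool.shutdown();
    }
  }
}
```

The explicit parameter also avoids a thread-local lookup on the hot path, which is the "a bit faster in sync" point from the description.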
Contributor
Usually we don't use abbreviations, so most code uses context and parentContext.

Contributor Author
Thanks - fixed it

future =
    future.whenComplete(
        (res, t) -> instrumenter.end(context, chatCompletionCreateParams, res, t));
return CompletableFutureWrapper.wrap(future, parentContext);
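The distinction between the two arguments matters here: the operation's own context ends the span in whenComplete, while the parent context is what downstream callbacks should see as current. A rough stdlib-only sketch of what a wrapper like this does, with a plain ThreadLocal standing in for OpenTelemetry's Context and hypothetical names (not the real CompletableFutureWrapper):

```java
import java.util.concurrent.CompletableFuture;

public class ContextFutures {
  // Stand-in for io.opentelemetry.context.Context (assumption for illustration).
  static final ThreadLocal<String> CURRENT = ThreadLocal.withInitial(() -> "root");

  // Completes the returned future with the given context made current, so any
  // dependent stages registered on it observe that context.
  static <T> CompletableFuture<T> wrap(CompletableFuture<T> future, String context) {
    CompletableFuture<T> result = new CompletableFuture<>();
    future.whenComplete((value, error) -> {
      String previous = CURRENT.get();
      CURRENT.set(context); // make the parent context current for callbacks
      try {
        if (error != null) {
          result.completeExceptionally(error);
        } else {
          result.complete(value);
        }
      } finally {
        CURRENT.set(previous);
      }
    });
    return result;
  }

  public static void main(String[] args) {
    CompletableFuture<String> future = new CompletableFuture<>();
    CompletableFuture<String> wrapped =
        wrap(future, "parentContext")
            .thenApply(v -> v + " seen-in-" + CURRENT.get());
    future.complete("response");
    System.out.println(wrapped.join()); // prints "response seen-in-parentContext"
  }
}
```

Passing the operation's own context here instead of the parent would make user callbacks run as children of the already-ended span, which is the bug described below.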
Contributor Author

I realized I had a big bug here, passing context instead of parentContext. Fixed it and updated the test.

@trask trask merged commit 8ddce65 into open-telemetry:main Jul 28, 2025
89 checks passed

3 participants